I looked into the issue, and it also seems to centre strongly on cryogenic brain preservation, along with advocacy of slowing down technological progress. I’ll go cynical and say that the reason they suddenly care so much about the future is that they want to live forever more safely. Never mind all the people who right now lack shelter, food, water, and protection from other people; and never mind that, while progress is what got us here, slowing it down will make things even worse.
Here is a blog post by the O.P. (Dmytry) on the topic of Futurism and Artificial Intelligence. It says:

It does appear to me that the SIAI advocates slowing down technological progress by the rest of the world. They would rather the progress happens within the SIAI—so they can “win”—and thus SAVE THE WORLD.

It also says:

The practical issue with such AI fear mongering is that it raises the probability of unabomber-like incidents.

Maybe. The associated negative marketing isn’t a great sign either. I don’t think Eliezer Yudkowsky had a good basis for claiming that “Novamente *would* destroy the world if it worked”. It just seems like bad-mouthing a competitor’s product to me; I think it is best to stick to the established facts about competitors’ products.